A constrained regularization approach for input-driven recurrent neural networks

Authors

  • R. Felix Reinhart
  • Jochen J. Steil
Abstract

We introduce a novel regularization approach for a class of input-driven recurrent neural networks. The regularization of the network parameters is constrained to reimplement a previously recorded state trajectory. We derive a closed-form solution for network regularization and show that the method is capable of reimplementing harvested dynamics. We investigate important properties of the method and of the regularized networks, and show that the regularization improves task-specific generalization on a combined prediction and non-linear sequence transduction task. The approach has strong theoretical and practical implications.
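The flavour of such a constrained closed-form solution can be illustrated with a toy linear version (an assumed simplification for illustration, not the paper's actual network model): choose the minimum-norm weight matrix W that maps each recorded state x_t to its recorded successor y_t, i.e. minimize ||W||_F^2 subject to W X = Y. The Lagrange-multiplier solution is the pseudoinverse formula W = Y X^+, which reproduces the recorded trajectory exactly when the states are linearly independent.

```python
import numpy as np

rng = np.random.default_rng(0)
n_units, n_steps = 20, 8  # trajectory shorter than the network size
X = rng.standard_normal((n_units, n_steps))  # recorded states x_1..x_T (hypothetical data)
Y = rng.standard_normal((n_units, n_steps))  # recorded successor states y_1..y_T

# Minimum-norm weights satisfying the constraint W X = Y:
# W = Y (X^T X)^{-1} X^T = Y X^+  (Lagrange-multiplier solution)
W = Y @ np.linalg.pinv(X)

# X has full column rank here, so the recorded trajectory is reimplemented exactly
assert np.allclose(W @ X, Y)
```

Among all weight matrices that satisfy the constraint, this W has the smallest Frobenius norm, which is what makes the constrained problem a regularization.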


Related articles

Constraint optimization of input-driven recurrent neural networks

We introduce a novel constraint optimization approach for a class of input-driven recurrent neural networks. A unified network model allows algebraic derivation of optimal network parameters using the Lagrange multiplier method. Regularization of the weights serves as the optimality criterion, while the solution is constrained to implement given network state dynamics. We derive the analytical, closed f...

Surprisal-Driven Zoneout

We propose a novel regularization method for recurrent neural networks called surprisal-driven zoneout. In this method, states zone out (maintain their previous value rather than updating) when the surprisal (discrepancy between the last state's prediction and the target) is small. Regularization is thus adaptive and input-driven on a per-neuron basis. We demonstrate the effectiveness of this idea...
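The per-neuron gating described above can be sketched in a few lines (an assumed simplification, not the authors' exact update rule; the surprisal values here are hypothetical): units whose surprisal falls below a threshold keep their previous state, and only the surprised units update.

```python
import numpy as np

def zoneout_step(h_prev, h_candidate, surprisal, threshold=0.1):
    """Per-unit gate: keep the old state where surprisal is small."""
    keep = surprisal < threshold  # small surprisal -> zone out (no update)
    return np.where(keep, h_prev, h_candidate)

h_prev = np.array([0.2, -0.5, 0.8])      # previous hidden state
h_new = np.array([0.9, 0.1, -0.3])       # freshly computed candidate update
surprisal = np.array([0.05, 0.5, 0.01])  # per-unit prediction error (hypothetical)

h_next = zoneout_step(h_prev, h_new, surprisal)
# units 0 and 2 keep their old values; only unit 1 updates
```

Because the gate is driven by prediction error rather than by random sampling, the amount of state change adapts to the input stream, which is the "adaptive and input-driven" property the abstract refers to.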

Multi-Step-Ahead Prediction of Stock Price Using a New Architecture of Neural Networks

Modelling and forecasting the stock market is a challenging task for economists and engineers, since it has a dynamic structure and nonlinear characteristics. This nonlinearity affects the efficiency of the price characteristics. Using an Artificial Neural Network (ANN) is a suitable way to model this nonlinearity, and ANNs have been used successfully in one-step-ahead and multi-step-ahead prediction of di...

Tikhonov Regularization for Long Short-Term Memory Networks

It is well known that adding noise to the input data often improves network performance. While the dropout technique can cause memory loss when applied to recurrent connections, Tikhonov regularization, which can be regarded as training with additive noise, naturally avoids this issue, though it requires deriving the regularizer for each architecture. In case of fee...
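The noise/Tikhonov connection mentioned above can be checked in its simplest, linear form (an illustrative sketch on synthetic data, not the paper's LSTM derivation): for linear regression, least squares over inputs corrupted by zero-mean noise of variance sigma^2 approaches ridge regression with penalty n·sigma^2·||w||^2 as the number of noise draws grows.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200
X = rng.standard_normal((n, 3))
y = X @ np.array([1.0, -2.0, 0.5]) + 0.1 * rng.standard_normal(n)

sigma2 = 0.5  # input-noise variance

# Tikhonov/ridge closed form: w = (X^T X + n*sigma2*I)^{-1} X^T y
w_ridge = np.linalg.solve(X.T @ X + n * sigma2 * np.eye(3), X.T @ y)

# Least squares over many noise-corrupted copies of the inputs:
# the cross terms average out, leaving the same normal equations.
K = 500
Xs = np.vstack([X + np.sqrt(sigma2) * rng.standard_normal(X.shape) for _ in range(K)])
ys = np.tile(y, K)
w_noise = np.linalg.lstsq(Xs, ys, rcond=None)[0]

# w_noise closely matches w_ridge
```

The equivalence follows from E||(X+E)w - y||^2 = ||Xw - y||^2 + n·sigma^2·||w||^2 when the noise E has i.i.d. zero-mean entries of variance sigma^2; deriving the analogous regularizer for a recurrent architecture is the harder step the abstract alludes to.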

Optimized Combination, Regularization, and Pruning in Parallel Consensual Neural Networks

Optimized combination, regularization, and pruning is proposed for Parallel Consensual Neural Networks (PCNNs), a neural network architecture based on the consensus of a collection of stage neural networks trained on the same input data with different representations. Here, a regularization scheme is presented for the PCNN, and in training a regularized cost function is minimized. Th...
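One plausible reading of the "optimized combination" step is a regularized least-squares weighting of the stage outputs (a sketch under that assumption, with hypothetical stage outputs standing in for trained stage networks):

```python
import numpy as np

rng = np.random.default_rng(2)
target = rng.standard_normal(100)
# Hypothetical stage-network outputs: noisy views of the same target
stages = np.stack([target + 0.3 * rng.standard_normal(100) for _ in range(4)])

lam = 1e-2  # regularization strength on the combination weights
A = stages @ stages.T + lam * np.eye(4)
w = np.linalg.solve(A, stages @ target)  # regularized least-squares weights
consensus = w @ stages

# The weighted consensus fits the target better than any single stage
errs = np.linalg.norm(stages - target, axis=1)
```

Averaging out the independent stage errors is what makes the consensus beat each individual stage; the regularizer keeps the weights stable when stage outputs are strongly correlated.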



Publication date: 2011